
Conversation

bradquarry

Many customers want to use our OpenAI inference endpoint against OpenAI-compatible APIs they have written themselves, or against Ollama or the NVIDIA Triton OpenAI-compatible frontend. I had heard that this was the intent of the OpenAI inference endpoint, but we do not state it directly. Can we validate that this is OK with the Search PM and include it in the docs?

  • Have you signed the contributor license agreement?
  • Have you followed the contributor guidelines?
  • If submitting code, have you built your formula locally prior to submission with gradle check?
  • If submitting code, is your pull request against main? Unless there is a good reason otherwise, we prefer pull requests against main and will backport as needed.
  • If submitting code, have you checked that your submission is for an OS and architecture that we support?
  • If you are submitting this code for a class then read our policy for that.

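The usage being asked about can be sketched as follows: a minimal example, assuming the `openai` service's optional `url` service setting is pointed at a local Ollama server on its default port. The endpoint ID, model name, and placeholder API key are illustrative, not taken from this PR:

```console
PUT _inference/text_embedding/ollama-embeddings
{
  "service": "openai",
  "service_settings": {
    "api_key": "unused-placeholder",
    "model_id": "nomic-embed-text",
    "url": "http://localhost:11434/v1/embeddings"
  }
}
```

Ollama and similar OpenAI-compatible servers typically ignore the API key, but the service may still require a value to be supplied.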

Documentation preview:

@elasticsearchmachine elasticsearchmachine added needs:triage Requires assignment of a team area label v8.17.5 labels Mar 21, 2025
@elasticsearchmachine
Collaborator

@bradquarry please enable the option "Allow edits and access to secrets by maintainers" on your PR. For more information, see the documentation.

@elasticsearchmachine elasticsearchmachine added the external-contributor Pull request authored by a developer outside the Elasticsearch team label Mar 21, 2025
@bradquarry
Author

Unfortunately, I don't see any button to allow edits by maintainers.

@timto-elastic

@szabosteve I assigned this to you. This probably needs to be updated in the v9 docs as well. @serenachou @aznick are you ok with this change?

@PeteGillinElastic PeteGillinElastic added >docs General docs changes and removed needs:triage Requires assignment of a team area label labels Mar 21, 2025
@elasticsearchmachine elasticsearchmachine added the Team:Docs Meta label for docs team label Mar 21, 2025
@elasticsearchmachine
Collaborator

Pinging @elastic/es-docs (Team:Docs)

@szabosteve
Contributor

Confirmed that this change is okay. Refer to this Slack thread for further info.


@szabosteve left a comment


LGTM, thank you!

@szabosteve szabosteve added auto-backport Automatically create backport pull requests when merged v8.18.1 v8.19.0 labels Mar 24, 2025
@szabosteve szabosteve merged commit 6800df7 into 8.17 Mar 24, 2025
6 checks passed
@szabosteve szabosteve deleted the bradquarry-patch-1 branch March 24, 2025 11:23
@elasticsearchmachine
Collaborator

💔 Backport failed

You can use sqren/backport to manually backport by running `backport --upstream elastic/elasticsearch --pr 125419`

szabosteve added a commit to szabosteve/elasticsearch that referenced this pull request Mar 24, 2025

Co-authored-by: István Zoltán Szabó <[email protected]>
szabosteve added a commit that referenced this pull request Mar 24, 2025
* Update service-openai.asciidoc (#125419)


Co-authored-by: István Zoltán Szabó <[email protected]>

* Update docs/reference/inference/service-openai.asciidoc

---------

Co-authored-by: Brad Quarry <[email protected]>
szabosteve added a commit that referenced this pull request Mar 24, 2025
* Update service-openai.asciidoc (#125419)


Co-authored-by: István Zoltán Szabó <[email protected]>

* Update docs/reference/inference/service-openai.asciidoc

---------

Co-authored-by: Brad Quarry <[email protected]>

Labels

  • auto-backport — Automatically create backport pull requests when merged
  • backport pending
  • >docs — General docs changes
  • external-contributor — Pull request authored by a developer outside the Elasticsearch team
  • Team:Docs — Meta label for docs team
  • v8.17.5, v8.18.1, v8.19.0


5 participants